Gallery of Planetary Nebula Spectra
We present the Gallery of Planetary Nebula Spectra, now available at
http://oit.williams.edu/nebulae. The website offers high-quality,
moderate-resolution (~7-10 A FWHM) spectra of 128 Galactic planetary nebulae from
3600-9600 A, obtained by Kwitter, Henry, and colleagues with the Goldcam
spectrograph at the KPNO 2.1-m or with the RC spectrograph at the CTIO 1.5-m.
The master PN table contains atlas data and an image link. A selected object's
spectrum is displayed in a zoomable window; line identification templates are
provided. In addition to the spectra themselves, the website also contains a
brief discussion of PNe as astronomical objects and as contributors to our
understanding of stellar evolution. We envision that this website, which
concentrates a large amount of data in one place, will be of interest to a
variety of users: researchers might need to check the spectrum of a particular
object of interest; the non-specialist astronomer might simply be interested in
perusing such a collection of spectra; and finally, teachers of introductory
astronomy can use this database to illustrate basic principles of atomic
physics and radiation. To particularly encourage this last use, we have
developed two paper-and-pencil exercises to introduce beginning astronomy
students to the wealth of information that PN spectra contain.
Comment: Two pages, two figures. Contributed paper to IAU Symp. 234,
"Planetary Nebulae in our Galaxy and Beyond."
An investigation of a pattern recognition system to analyse and classify dried fruit
Includes bibliographical references.
Both the declining cost and the increasing capabilities of specialised computer hardware for image processing have enabled computer vision systems to become a viable alternative to human visual inspection in industrial applications. In this thesis a vision system that will analyse and classify dried fruit is investigated. In human visual inspection of dried fruit, the colour of the fruit is often the main determinant of its grade; in specific cases the presence of blemishes and geometrical faults is also incorporated in order to determine the fruit grade. A colour model that would successfully represent the colour variations within dried fruit grades was investigated. The selected colour feature space formed the basis of a classification system which automatically allocated a sample unit of dried fruit to one specific grade. Various classification methods were investigated, and the one best suited to the system data and parameters was selected and evaluated using test sets of three types of dried fruit. In order to successfully grade dried fruit, a number of additional problems had to be catered for: the red/brown central core area of dried peaches had to be removed from the colour analysis, and black blemishes on dried pears had to be isolated and sized in order to supplement the colour classifier in the final classification of the pear. The core area of a dried peach was isolated using the morphological top-hat transform, and black blemishes on pears were isolated using colour histogram thresholding techniques.
The test results indicated that although colour classification was the major determinant in the grading of dried fruit, other characteristics of the fruit had to be incorporated to achieve successful final classification results. These characteristics may differ between types of dried fruit, but for dried apricots, dried peaches, and dried pears they include peach core area removal, fruit geometry validation, and dried pear blemish isolation and sizing.
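The core-removal step described above can be illustrated with a small sketch. This is not the thesis's implementation; it assumes a synthetic grayscale image and uses SciPy's black top-hat (closing minus image), which responds to dark features smaller than the structuring element, to isolate a dark "core" on a brighter fruit surface:

```python
import numpy as np
from scipy import ndimage

# Illustrative sketch (not the thesis's code): isolate a small dark
# "core" region on a brighter fruit surface with the morphological
# black top-hat transform.
fruit = np.full((64, 64), 200.0)        # bright fruit flesh
fruit[28:36, 28:36] = 60.0              # dark 8x8 central core

# Black top-hat = morphological closing minus the image; it highlights
# dark features smaller than the structuring element (here 15x15).
tophat = ndimage.black_tophat(fruit, size=15)
core_mask = tophat > 50                 # simple threshold on the response

print(core_mask.sum())                  # number of pixels flagged as core
```

The flagged pixels could then be excluded from the colour analysis before classification, as the thesis does for dried peaches.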
A prototype for large site analysis : 1000 acres in Spotsylvania, Virginia
Thesis (M.S.)--Massachusetts Institute of Technology, Dept. of Urban Studies and Planning, 1987. Bibliography: leaves 110-112. By Henry G. Brauer and Karen Kelsey. M.S.
Deep Haptic Model Predictive Control for Robot-Assisted Dressing
Robot-assisted dressing offers an opportunity to benefit the lives of many
people with disabilities, such as some older adults. However, robots currently
lack common sense about the physical implications of their actions on people.
The physical implications of dressing are complicated by non-rigid garments,
which can result in a robot indirectly applying high forces to a person's body.
We present a deep recurrent model that, when given a proposed action by the
robot, predicts the forces a garment will apply to a person's body. We also
show that a robot can provide better dressing assistance by using this model
with model predictive control. The predictions made by our model only use
haptic and kinematic observations from the robot's end effector, which are
readily attainable. Collecting training data from real world physical
human-robot interaction can be time consuming, costly, and put people at risk.
Instead, we train our predictive model using data collected in an entirely
self-supervised fashion from a physics-based simulation. We evaluated our
approach with a PR2 robot that attempted to pull a hospital gown onto the arms
of 10 human participants. With a 0.2s prediction horizon, our controller
succeeded at high rates and lowered applied force while navigating the garment
around a person's fist and elbow without getting caught. Shorter prediction
horizons resulted in significantly reduced performance with the sleeve catching
on the participants' fists and elbows, demonstrating the value of our model's
predictions. These behaviors of mitigating catches emerged from our deep
predictive model and the controller objective function, which primarily
penalizes high forces.
Comment: 8 pages, 12 figures, 1 table, 2018 IEEE International Conference on
Robotics and Automation (ICRA)
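The control scheme the abstract describes, proposing actions, predicting the resulting garment forces with a learned model, and choosing the action that balances task progress against predicted force, can be sketched in miniature. Everything below is a placeholder: the linear "predictor", the candidate set, and the cost weights stand in for the paper's trained recurrent network and controller.

```python
import numpy as np

def predict_force(action):
    """Stand-in for a learned model mapping a proposed end-effector
    action to the force the garment would apply to the person."""
    return 5.0 * abs(action[1]) + 0.5      # sideways motion drags the sleeve

def mpc_step(goal_dir):
    # Enumerate candidate actions, score each by task progress plus a
    # strong penalty on the predicted force, and keep the best one.
    xs = np.linspace(-1.0, 1.0, 9)
    candidates = [np.array([x, y]) for x in xs for y in xs]
    def cost(a):
        progress = -float(a @ goal_dir)    # reward motion toward the goal
        return progress + 10.0 * predict_force(a)
    return min(candidates, key=cost)

best = mpc_step(np.array([1.0, 0.0]))
print(best)
```

With this toy cost, the chosen action moves straight along the goal direction and avoids the sideways motion that the force model penalizes, mirroring how the paper's controller mitigates catches by penalizing high predicted forces.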
Abundances of PNe in the Outer Disk of M31
We present spectroscopic observations and chemical abundances of 16 planetary
nebulae (PNe) in the outer disk of M31. The [O III] 4363 line is detected in
all objects, allowing a direct measurement of the nebular temperature essential
for accurate abundance determinations. Our results show that the abundances in
these M31 PNe display the same correlations and general behaviors as Type II
PNe in the Milky Way Galaxy. We also calculate photoionization models to derive
estimates of central star properties. From these we infer that our sample PNe,
all near the peak of the Planetary Nebula Luminosity Function, originated from
stars near 2 M_sun. Finally, under the assumption that these PNe are located in
M31's disk, we plot the oxygen abundance gradient, which appears shallower than
the gradient in the Milky Way.
Comment: 48 pages, including 12 figures and 8 tables, accepted by the
Astrophysical Journal
Multidimensional Capacitive Sensing for Robot-Assisted Dressing and Bathing
Robotic assistance presents an opportunity to benefit the lives of many
people with physical disabilities, yet accurately sensing the human body and
tracking human motion remain difficult for robots. We present a
multidimensional capacitive sensing technique that estimates the local pose of
a human limb in real time. A key benefit of this sensing method is that it can
sense the limb through opaque materials, including fabrics and wet cloth. Our
method uses a multielectrode capacitive sensor mounted to a robot's end
effector. A neural network model estimates the position of the closest point on
a person's limb and the orientation of the limb's central axis relative to the
sensor's frame of reference. These pose estimates enable the robot to move its
end effector with respect to the limb using feedback control. We demonstrate
that a PR2 robot can use this approach with a custom six electrode capacitive
sensor to assist with two activities of daily living: dressing and bathing. The
robot pulled the sleeve of a hospital gown onto able-bodied participants' right
arms, while tracking human motion. When assisting with bathing, the robot moved
a soft wet washcloth to follow the contours of able-bodied participants' limbs,
cleaning their surfaces. Overall, we found that multidimensional capacitive
sensing presents a promising approach for robots to sense and track the human
body during assistive tasks that require physical human-robot interaction.
Comment: 8 pages, 16 figures, International Conference on Rehabilitation
Robotics 201
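The feedback-control step the abstract describes, moving the end effector relative to the estimated limb pose, can be sketched with a simple proportional controller. The offset, gain, and geometry below are illustrative assumptions, and the limb estimate is a fixed point rather than the output of the paper's capacitive-sensing network:

```python
import numpy as np

# Illustrative sketch (not the paper's controller): proportional feedback
# that holds the end effector at a fixed offset above the estimated
# closest point on the limb, as a capacitive pose estimate would enable.
def feedback_step(ee_pos, limb_point, offset=0.05, gain=0.5):
    """Move the end effector toward a point `offset` meters above the
    estimated closest point on the limb (all positions in meters)."""
    target = limb_point + np.array([0.0, 0.0, offset])
    return ee_pos + gain * (target - ee_pos)

ee = np.array([0.0, 0.0, 0.3])          # end-effector start position
limb = np.array([0.0, 0.1, 0.0])        # estimated closest point on limb
for _ in range(20):
    ee = feedback_step(ee, limb)
print(np.round(ee, 3))
```

Because the real estimate updates every control cycle, the same loop would track a moving limb, which is how the robot follows participants' motion while dressing or wiping along the limb's contours.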